Console Output

Training and evaluating model for: Laptop
Dataset length: 16183 windows
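The window count above suggests the input power series was segmented into fixed-length sliding windows before training. A minimal sketch of that segmentation; the window length and stride below are illustrative assumptions, since the log does not state the values actually used:

```python
import numpy as np

def make_windows(series: np.ndarray, window: int = 60, stride: int = 1) -> np.ndarray:
    """Segment a 1-D series into overlapping fixed-length windows.

    `window` and `stride` are placeholders; the real run's values
    are not shown in the log above.
    """
    n = (len(series) - window) // stride + 1
    return np.stack([series[i * stride : i * stride + window] for i in range(n)])

windows = make_windows(np.arange(100, dtype=float), window=10, stride=5)
print(windows.shape)  # (19, 10)
```

With stride 1 (the densest segmentation), a series of length N yields N - window + 1 windows, which is how counts like 16183 typically arise.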


NILMModel(
  (conv1d): Conv1d(9, 9, kernel_size=(3,), stride=(1,), padding=(1,))
  (lstm): LSTM(9, 256, num_layers=5, batch_first=True, dropout=0.1)
  (dropout): Dropout(p=0.1, inplace=False)
  (relu): ReLU()
  (output_layer): Linear(in_features=256, out_features=1, bias=True)
)
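The printed module list can be reconstructed as the PyTorch class below. Note that the printout only shows the layers, not how they are wired; the forward pass here (conv over the channel axis, then LSTM, then the last time step into the linear head) is an assumption consistent with a common seq-to-point NILM layout:

```python
import torch
import torch.nn as nn

class NILMModel(nn.Module):
    """Sketch matching the module list printed above.

    The forward wiring is assumed, not taken from the log.
    """

    def __init__(self, in_channels: int = 9, hidden: int = 256):
        super().__init__()
        self.conv1d = nn.Conv1d(in_channels, in_channels, kernel_size=3, stride=1, padding=1)
        self.lstm = nn.LSTM(in_channels, hidden, num_layers=5, batch_first=True, dropout=0.1)
        self.dropout = nn.Dropout(p=0.1)
        self.relu = nn.ReLU()
        self.output_layer = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, channels); Conv1d expects (batch, channels, seq_len)
        x = self.relu(self.conv1d(x.transpose(1, 2))).transpose(1, 2)
        out, _ = self.lstm(x)                 # (batch, seq_len, hidden)
        out = self.dropout(out[:, -1, :])     # keep the last time step
        return self.output_layer(out)         # (batch, 1)

model = NILMModel()
print(model(torch.randn(4, 60, 9)).shape)  # torch.Size([4, 1])
```

Padding of 1 with kernel size 3 keeps the sequence length unchanged through the convolution, so the LSTM sees the same number of time steps as the input window.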
Epoch [1/300], Train Loss: 0.003350
Validation Loss: 0.003748
Epoch [2/300], Train Loss: 0.003107
Validation Loss: 0.003733
Epoch [3/300], Train Loss: 0.003209
Validation Loss: 0.003725
Epoch [4/300], Train Loss: 0.003104
Validation Loss: 0.003586
Epoch [5/300], Train Loss: 0.003022
Validation Loss: 0.003584
Epoch [6/300], Train Loss: 0.003048
Validation Loss: 0.003539
Epoch [7/300], Train Loss: 0.003073
Validation Loss: 0.003548
Epoch [8/300], Train Loss: 0.003021
Validation Loss: 0.003616
Epoch [9/300], Train Loss: 0.003002
Validation Loss: 0.003570
Epoch [10/300], Train Loss: 0.003012
Validation Loss: 0.003525
Epoch [11/300], Train Loss: 0.003011
Validation Loss: 0.003583
Epoch [12/300], Train Loss: 0.003027
Validation Loss: 0.003530
Epoch [13/300], Train Loss: 0.002987
Validation Loss: 0.003550
Epoch [14/300], Train Loss: 0.002997
Validation Loss: 0.003496
Epoch [15/300], Train Loss: 0.002984
Validation Loss: 0.003515
Epoch [16/300], Train Loss: 0.003020
Validation Loss: 0.003546
Epoch [17/300], Train Loss: 0.002996
Validation Loss: 0.003525
Epoch [18/300], Train Loss: 0.002978
Validation Loss: 0.003535
Epoch [19/300], Train Loss: 0.002966
Validation Loss: 0.003486
Epoch [20/300], Train Loss: 0.002963
Validation Loss: 0.003461
Epoch [21/300], Train Loss: 0.002967
Validation Loss: 0.003453
Epoch [22/300], Train Loss: 0.002970
Validation Loss: 0.003543
Epoch [23/300], Train Loss: 0.002951
Validation Loss: 0.003401
Epoch [24/300], Train Loss: 0.002979
Validation Loss: 0.003423
Epoch [25/300], Train Loss: 0.002918
Validation Loss: 0.003480
Epoch [26/300], Train Loss: 0.002963
Validation Loss: 0.003413
Epoch [27/300], Train Loss: 0.002974
Validation Loss: 0.003484
Epoch [28/300], Train Loss: 0.002959
Validation Loss: 0.003399
Epoch [29/300], Train Loss: 0.002954
Validation Loss: 0.003345
Epoch [30/300], Train Loss: 0.002897
Validation Loss: 0.003467
Epoch [31/300], Train Loss: 0.002931
Validation Loss: 0.003326
Epoch [32/300], Train Loss: 0.002943
Validation Loss: 0.003359
Epoch [33/300], Train Loss: 0.002915
Validation Loss: 0.003367
Epoch [34/300], Train Loss: 0.002889
Validation Loss: 0.003309
Epoch [35/300], Train Loss: 0.002895
Validation Loss: 0.003312
Epoch [36/300], Train Loss: 0.002897
Validation Loss: 0.003314
Epoch [37/300], Train Loss: 0.002895
Validation Loss: 0.003486
Epoch [38/300], Train Loss: 0.002905
Validation Loss: 0.003279
Epoch [39/300], Train Loss: 0.002940
Validation Loss: 0.003436
Epoch [40/300], Train Loss: 0.002895
Validation Loss: 0.003270
Epoch [41/300], Train Loss: 0.002891
Validation Loss: 0.003382
Epoch [42/300], Train Loss: 0.002941
Validation Loss: 0.003293
Epoch [43/300], Train Loss: 0.002869
Validation Loss: 0.003290
Epoch [44/300], Train Loss: 0.002925
Validation Loss: 0.003281
Epoch [45/300], Train Loss: 0.002900
Validation Loss: 0.003284
Epoch [46/300], Train Loss: 0.002915
Validation Loss: 0.003335
Epoch [47/300], Train Loss: 0.002906
Validation Loss: 0.003268
Epoch [48/300], Train Loss: 0.002862
Validation Loss: 0.003350
Epoch [49/300], Train Loss: 0.002885
Validation Loss: 0.003233
Epoch [50/300], Train Loss: 0.002846
Validation Loss: 0.003386
Epoch [51/300], Train Loss: 0.002854
Validation Loss: 0.003226
Epoch [52/300], Train Loss: 0.002870
Validation Loss: 0.003305
Epoch [53/300], Train Loss: 0.002872
Validation Loss: 0.003238
Epoch [54/300], Train Loss: 0.002880
Validation Loss: 0.003575
Epoch [55/300], Train Loss: 0.002866
Validation Loss: 0.003263
Epoch [56/300], Train Loss: 0.002843
Validation Loss: 0.003294
Epoch [57/300], Train Loss: 0.002895
Validation Loss: 0.003382
Epoch [58/300], Train Loss: 0.002834
Validation Loss: 0.003274
Epoch [59/300], Train Loss: 0.002830
Validation Loss: 0.003188
Epoch [60/300], Train Loss: 0.002894
Validation Loss: 0.003212
Epoch [61/300], Train Loss: 0.002827
Validation Loss: 0.003422
Epoch [62/300], Train Loss: 0.002861
Validation Loss: 0.003282
Epoch [63/300], Train Loss: 0.002812
Validation Loss: 0.003181
Epoch [64/300], Train Loss: 0.002837
Validation Loss: 0.003235
Epoch [65/300], Train Loss: 0.002793
Validation Loss: 0.003224
Epoch [66/300], Train Loss: 0.002787
Validation Loss: 0.003156
Epoch [67/300], Train Loss: 0.002803
Validation Loss: 0.003557
Epoch [68/300], Train Loss: 0.002831
Validation Loss: 0.003146
Epoch [69/300], Train Loss: 0.002824
Validation Loss: 0.003257
Epoch [70/300], Train Loss: 0.002808
Validation Loss: 0.003142
Epoch [71/300], Train Loss: 0.002783
Validation Loss: 0.003060
Epoch [72/300], Train Loss: 0.002732
Validation Loss: 0.002977
Epoch [73/300], Train Loss: 0.002673
Validation Loss: 0.002989
Epoch [74/300], Train Loss: 0.002645
Validation Loss: 0.002909
Epoch [75/300], Train Loss: 0.002626
Validation Loss: 0.002957
Epoch [76/300], Train Loss: 0.002625
Validation Loss: 0.002901
Epoch [77/300], Train Loss: 0.002654
Validation Loss: 0.002817
Epoch [78/300], Train Loss: 0.002595
Validation Loss: 0.002730
Epoch [79/300], Train Loss: 0.002531
Validation Loss: 0.002677
Epoch [80/300], Train Loss: 0.002450
Validation Loss: 0.002633
Epoch [81/300], Train Loss: 0.002445
Validation Loss: 0.002741
Epoch [82/300], Train Loss: 0.002365
Validation Loss: 0.002418
Epoch [83/300], Train Loss: 0.002251
Validation Loss: 0.002311
Epoch [84/300], Train Loss: 0.002219
Validation Loss: 0.002529
Epoch [85/300], Train Loss: 0.002179
Validation Loss: 0.002214
Epoch [86/300], Train Loss: 0.002118
Validation Loss: 0.002167
Epoch [87/300], Train Loss: 0.002073
Validation Loss: 0.002035
Epoch [88/300], Train Loss: 0.001997
Validation Loss: 0.001898
Epoch [89/300], Train Loss: 0.002094
Validation Loss: 0.001958
Epoch [90/300], Train Loss: 0.002029
Validation Loss: 0.001830
Epoch [91/300], Train Loss: 0.001610
Validation Loss: 0.001461
Epoch [92/300], Train Loss: 0.001418
Validation Loss: 0.001463
Epoch [93/300], Train Loss: 0.001189
Validation Loss: 0.001332
Epoch [94/300], Train Loss: 0.001066
Validation Loss: 0.000956
Epoch [95/300], Train Loss: 0.000954
Validation Loss: 0.001258
Epoch [96/300], Train Loss: 0.000868
Validation Loss: 0.000846
Epoch [97/300], Train Loss: 0.000805
Validation Loss: 0.000833
Epoch [98/300], Train Loss: 0.000758
Validation Loss: 0.000824
Epoch [99/300], Train Loss: 0.000703
Validation Loss: 0.000766
Epoch [100/300], Train Loss: 0.000677
Validation Loss: 0.000733
Epoch [101/300], Train Loss: 0.000643
Validation Loss: 0.000683
Epoch [102/300], Train Loss: 0.000620
Validation Loss: 0.000666
Epoch [103/300], Train Loss: 0.000580
Validation Loss: 0.000637
Epoch [104/300], Train Loss: 0.000615
Validation Loss: 0.000616
Epoch [105/300], Train Loss: 0.000585
Validation Loss: 0.000657
Epoch [106/300], Train Loss: 0.000520
Validation Loss: 0.000702
Epoch [107/300], Train Loss: 0.000541
Validation Loss: 0.000580
Epoch [108/300], Train Loss: 0.000546
Validation Loss: 0.000544
Epoch [109/300], Train Loss: 0.000482
Validation Loss: 0.000549
Epoch [110/300], Train Loss: 0.000474
Validation Loss: 0.000551
Epoch [111/300], Train Loss: 0.000481
Validation Loss: 0.000505
Epoch [112/300], Train Loss: 0.000454
Validation Loss: 0.000603
Epoch [113/300], Train Loss: 0.000450
Validation Loss: 0.000488
Epoch [114/300], Train Loss: 0.000491
Validation Loss: 0.000761
Epoch [115/300], Train Loss: 0.000463
Validation Loss: 0.000573
Epoch [116/300], Train Loss: 0.000467
Validation Loss: 0.000489
Epoch [117/300], Train Loss: 0.000431
Validation Loss: 0.000801
Epoch [118/300], Train Loss: 0.000429
Validation Loss: 0.000452
Epoch [119/300], Train Loss: 0.000457
Validation Loss: 0.000466
Epoch [120/300], Train Loss: 0.000418
Validation Loss: 0.000476
Epoch [121/300], Train Loss: 0.000481
Validation Loss: 0.000461
Epoch [122/300], Train Loss: 0.000429
Validation Loss: 0.000448
Epoch [123/300], Train Loss: 0.000391
Validation Loss: 0.000421
Epoch [124/300], Train Loss: 0.000399
Validation Loss: 0.000418
Epoch [125/300], Train Loss: 0.000434
Validation Loss: 0.000540
Epoch [126/300], Train Loss: 0.000393
Validation Loss: 0.000409
Epoch [127/300], Train Loss: 0.000388
Validation Loss: 0.000431
Epoch [128/300], Train Loss: 0.000385
Validation Loss: 0.000385
Epoch [129/300], Train Loss: 0.000399
Validation Loss: 0.000493
Epoch [130/300], Train Loss: 0.000430
Validation Loss: 0.000412
Epoch [131/300], Train Loss: 0.000353
Validation Loss: 0.000387
Epoch [132/300], Train Loss: 0.000356
Validation Loss: 0.000399
Epoch [133/300], Train Loss: 0.000397
Validation Loss: 0.000397
Epoch [134/300], Train Loss: 0.000421
Validation Loss: 0.000381
Epoch [135/300], Train Loss: 0.000378
Validation Loss: 0.000595
Epoch [136/300], Train Loss: 0.000396
Validation Loss: 0.000375
Epoch [137/300], Train Loss: 0.000358
Validation Loss: 0.000623
Epoch [138/300], Train Loss: 0.000414
Validation Loss: 0.000391
Epoch [139/300], Train Loss: 0.000332
Validation Loss: 0.000350
Epoch [140/300], Train Loss: 0.000341
Validation Loss: 0.000385
Epoch [141/300], Train Loss: 0.000362
Validation Loss: 0.000363
Epoch [142/300], Train Loss: 0.000385
Validation Loss: 0.000354
Epoch [143/300], Train Loss: 0.000339
Validation Loss: 0.000378
Epoch [144/300], Train Loss: 0.000357
Validation Loss: 0.000350
Epoch [145/300], Train Loss: 0.000368
Validation Loss: 0.000346
Epoch [146/300], Train Loss: 0.000335
Validation Loss: 0.000344
Epoch [147/300], Train Loss: 0.000338
Validation Loss: 0.000329
Epoch [148/300], Train Loss: 0.000336
Validation Loss: 0.001284
Epoch [149/300], Train Loss: 0.000465
Validation Loss: 0.000341
Epoch [150/300], Train Loss: 0.000348
Validation Loss: 0.000343
Epoch [151/300], Train Loss: 0.000297
Validation Loss: 0.000315
Epoch [152/300], Train Loss: 0.000328
Validation Loss: 0.000320
Epoch [153/300], Train Loss: 0.000364
Validation Loss: 0.000399
Epoch [154/300], Train Loss: 0.000317
Validation Loss: 0.000326
Epoch [155/300], Train Loss: 0.000332
Validation Loss: 0.000334
Epoch [156/300], Train Loss: 0.000316
Validation Loss: 0.000382
Epoch [157/300], Train Loss: 0.000332
Validation Loss: 0.000318
Epoch [158/300], Train Loss: 0.000312
Validation Loss: 0.000322
Epoch [159/300], Train Loss: 0.000312
Validation Loss: 0.000311
Epoch [160/300], Train Loss: 0.000382
Validation Loss: 0.000376
Epoch [161/300], Train Loss: 0.000374
Validation Loss: 0.000310
Epoch [162/300], Train Loss: 0.000309
Validation Loss: 0.000310
Epoch [163/300], Train Loss: 0.000308
Validation Loss: 0.000295
Epoch [164/300], Train Loss: 0.000377
Validation Loss: 0.000382
Epoch [165/300], Train Loss: 0.000330
Validation Loss: 0.000355
Epoch [166/300], Train Loss: 0.000324
Validation Loss: 0.000335
Epoch [167/300], Train Loss: 0.000268
Validation Loss: 0.000284
Epoch [168/300], Train Loss: 0.000363
Validation Loss: 0.000687
Epoch [169/300], Train Loss: 0.000410
Validation Loss: 0.000291
Epoch [170/300], Train Loss: 0.000304
Validation Loss: 0.000488
Epoch [171/300], Train Loss: 0.000309
Validation Loss: 0.000285
Epoch [172/300], Train Loss: 0.000257
Validation Loss: 0.000278
Epoch [173/300], Train Loss: 0.000358
Validation Loss: 0.000288
Epoch [174/300], Train Loss: 0.000351
Validation Loss: 0.000320
Epoch [175/300], Train Loss: 0.000270
Validation Loss: 0.000307
Epoch [176/300], Train Loss: 0.000378
Validation Loss: 0.000329
Epoch [177/300], Train Loss: 0.000268
Validation Loss: 0.000297
Epoch [178/300], Train Loss: 0.001185
Validation Loss: 0.001254
Epoch [179/300], Train Loss: 0.000773
Validation Loss: 0.000630
Epoch [180/300], Train Loss: 0.000543
Validation Loss: 0.000619
Epoch [181/300], Train Loss: 0.000486
Validation Loss: 0.000466
Epoch [182/300], Train Loss: 0.000434
Validation Loss: 0.000417
Early stopping triggered
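Training halts at epoch 182 of 300 once validation loss stops improving. The patience value and threshold used in this run are not shown in the log; a typical patience-based check looks like this sketch (patience of 3 here is purely illustrative):

```python
class EarlyStopping:
    """Stop when validation loss has not improved for `patience`
    consecutive epochs. The values used in the run above are
    not shown in the log; these are illustrative defaults.
    """

    def __init__(self, patience: int = 3, min_delta: float = 0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.counter = 0

    def step(self, val_loss: float) -> bool:
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.counter = 0
        else:
            self.counter += 1
        return self.counter >= self.patience  # True means stop training

stopper = EarlyStopping(patience=3)
losses = [0.010, 0.008, 0.009, 0.009, 0.009]
flags = [stopper.step(l) for l in losses]
print(flags)  # [False, False, False, False, True]
```

The loss spike around epochs 178-181 (train loss jumping from ~0.00027 to 0.00118) is what exhausts the patience counter here, even though the loss was recovering when the stop fired.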

Evaluating model for: Laptop
Validation MAE: 0.562455 W
Validation MSE: 2.969129 W²
Validation RMSE: 1.723116 W
Signal Aggregate Error (SAE): 0.001948
Normalized Disaggregation Error (NDE): 0.199861
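The evaluation metrics above can be computed as follows. SAE and NDE are given their usual NILM definitions (SAE = |Σŷ − Σy| / Σy, NDE = Σ(ŷ − y)² / Σy²); these are assumed to match the definitions used by the evaluation code, which the log does not show:

```python
import numpy as np

def nilm_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """Standard NILM evaluation metrics (assumed definitions)."""
    err = y_pred - y_true
    return {
        "MAE": float(np.mean(np.abs(err))),
        "MSE": float(np.mean(err ** 2)),
        "RMSE": float(np.sqrt(np.mean(err ** 2))),
        # Signal Aggregate Error: relative error of total predicted energy
        "SAE": float(abs(y_pred.sum() - y_true.sum()) / y_true.sum()),
        # Normalized Disaggregation Error: squared error normalized by signal energy
        "NDE": float(np.sum(err ** 2) / np.sum(y_true ** 2)),
    }

y = np.array([0.0, 10.0, 10.0, 0.0])   # toy ground-truth appliance power (W)
p = np.array([1.0, 9.0, 11.0, 0.0])    # toy predictions (W)
print(nilm_metrics(y, p))
```

As a sanity check on the report above, RMSE is the square root of MSE: sqrt(2.969129) ≈ 1.723116 W, matching the printed values.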

Training and Validation Loss

[Figure: training and validation loss curves; interactive plot omitted]